# GPT2 architecture
## Uzmi Gpt
- License: Apache-2.0
- Tags: Large Language Model, English
- Author: rajan3208
- Downloads: 30 · Likes: 2

GPT-2 is an open-source language model developed by OpenAI, based on the Transformer architecture and capable of generating coherent text.
## Sinhala Gpt2
- Tags: Large Language Model, Other
- Author: keshan
- Downloads: 19 · Likes: 1

A small GPT-2 model trained on the MC4 Sinhala dataset, suitable for Sinhala text-generation tasks.
## Gpt2 Base Chinese
- License: GPL-3.0
- Tags: Large Language Model, Chinese
- Author: ckiplab
- Downloads: 1,680 · Likes: 30

A Traditional Chinese GPT-2 model developed by Academia Sinica's CKIP team, suitable for Chinese text-generation tasks.
## Gpyt
- License: MIT
- Tags: Large Language Model, Transformers, Other
- Author: Sentdex
- Downloads: 21 · Likes: 22

GPyT is a GPT-2 model trained from scratch on 80 GB of pure Python code, focused on Python code generation.
## Gpt2 Base Bne
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Spanish
- Author: PlanTL-GOB-ES
- Downloads: 4,049 · Likes: 12

A Spanish language model based on the GPT-2 architecture, trained on web-crawled data from the Spanish National Library (Biblioteca Nacional de España) collected between 2009 and 2019.
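Since every model listed here is a GPT-2 variant, they share the same loading and generation interface. A minimal sketch using the Hugging Face `transformers` text-generation pipeline, shown with the original `gpt2` checkpoint; the Hub IDs for the models above (e.g. `ckiplab/gpt2-base-chinese`) are assumptions inferred from the author/name pairs in this listing, so substitute the ID you actually need:

```python
from transformers import pipeline

# Load a GPT-2 style checkpoint from the Hugging Face Hub.
# "gpt2" is the original OpenAI release; an ID such as
# "ckiplab/gpt2-base-chinese" (assumed from the listing above)
# would be loaded the same way.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "The Transformer architecture",
    max_new_tokens=20,       # cap on newly generated tokens
    num_return_sequences=1,  # return a single completion
    do_sample=True,          # sample instead of greedy decoding
)
print(result[0]["generated_text"])
```

The same pattern works for the code-oriented GPyT model, with a Python source snippet as the prompt instead of natural language.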